Relations between entropy and error probability
Authors
Abstract
Similar papers
Analytical Bounds between Entropy and Error Probability in Binary Classifications
The existing upper and lower bounds between entropy and error probability are mostly derived from the inequality of the entropy relations, which could introduce approximations into the analysis. We derive analytical bounds based on the closed-form solutions of conditional entropy without involving any approximation. Two basic types of classification errors are investigated in the context of bin...
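The entropy–error relation this abstract discusses can be illustrated with a minimal, hypothetical sketch (not taken from the paper): for a uniform binary source observed through a binary symmetric channel with crossover probability `eps`, the conditional entropy H(X|Y) equals the binary entropy h(eps), and Fano's inequality H(X|Y) ≤ h(Pe) holds with equality, since the Bayes error is Pe = eps.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical example: X ~ Bernoulli(0.5) observed through a binary
# symmetric channel with crossover probability eps.  The MAP rule errs
# with probability Pe = eps, and H(X|Y) = h(eps), so Fano's bound
# H(X|Y) <= h(Pe) is tight in this symmetric case.
eps = 0.1
h_cond = binary_entropy(eps)   # H(X|Y) for the BSC with uniform input
pe = eps                       # Bayes error probability
assert h_cond <= binary_entropy(pe) + 1e-12
print(f"Pe = {pe}, H(X|Y) = {h_cond:.4f} bits, h(Pe) = {binary_entropy(pe):.4f} bits")
```

This is only the symmetric special case; the paper's bounds cover general joint distributions.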
An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications
In this work, we propose a new approach for deriving the bounds between entropy and error from a joint distribution through an optimization means. The specific case study is given on binary classifications. Two basic types of classification errors are investigated, namely, the Bayesian and non-Bayesian errors. The consideration of non-Bayesian errors is due to the fact that most classifiers res...
Velocity Distribution in the 90-degree Bend based on the Probability and Entropy Concept
The velocity distribution of pressure flow in bends is of practical interest, and its use in professional engineering design is investigated in the current study. This paper shows that the velocity distribution in bends can be analyzed in terms of probability distributions. The probability-based concept of entropy is a new, applied approach to obtaining the velocity pro...
Taylor Expansion for the Entropy Rate of Hidden Markov Chains
We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, computing the exact entropy rate is an open problem. We introduce some probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...
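As a hedged illustration of the setting this abstract describes (estimating, rather than exactly computing, the entropy rate), the sketch below draws a long sample path of a symmetric binary Markov chain observed through a binary symmetric channel and estimates the entropy rate as -(1/n) log2 P(y_1..y_n) via the normalized forward recursion (Shannon–McMillan–Breiman). The function name and parameters (`p` for the chain's flip probability, `eps` for the channel crossover) are assumptions for illustration, not the paper's notation or method.

```python
import math
import random

def estimate_entropy_rate(p: float, eps: float, n: int = 200_000, seed: int = 0) -> float:
    """Monte Carlo estimate (bits/symbol) of the entropy rate of a hidden
    Markov process: a symmetric binary Markov chain with flip probability p,
    observed through a binary symmetric channel with crossover probability
    eps.  Uses -(1/n) log2 P(y_1..y_n), computed with a normalized forward
    recursion on a single sampled path."""
    rng = random.Random(seed)
    x = rng.randint(0, 1)          # hidden state, stationary uniform start
    log_prob = 0.0
    pred0 = 0.5                    # P(x_t = 0 | y_1..y_{t-1})
    for _ in range(n):
        # Advance the hidden chain and emit a noisy observation.
        if rng.random() < p:
            x ^= 1
        y = x ^ (1 if rng.random() < eps else 0)
        # Likelihoods P(y | x) for the binary symmetric channel.
        lik0 = eps if y else 1 - eps
        lik1 = 1 - eps if y else eps
        # Predictive probability P(y_t | y_1..y_{t-1}).
        py = pred0 * lik0 + (1 - pred0) * lik1
        log_prob += math.log2(py)
        # Bayes filter update, then one-step prediction through the chain.
        filt0 = pred0 * lik0 / py
        pred0 = filt0 * (1 - p) + (1 - filt0) * p
    return -log_prob / n
```

Sanity checks on two solvable corners: with p = 0.5 the observations are i.i.d. uniform (entropy rate exactly 1 bit), and with eps = 0 the estimate converges to the binary entropy h(p) of the clean Markov chain.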
A Preferred Definition of Conditional Rényi Entropy
The Rényi entropy is a generalization of Shannon entropy to a one-parameter family of entropies. Tsallis entropy is likewise a generalization of Shannon entropy, with a non-logarithmic measure. After the introduction of Shannon entropy, the conditional Shannon entropy was derived and its properties became known. Also, for Tsallis entropy, the conditional entropy was introduced a...
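A small sketch (assumed notation for illustration, not the paper's definitions) of the one-parameter Rényi family, showing that as the order approaches 1 it recovers the Shannon entropy:

```python
import math

def renyi_entropy(probs, alpha: float) -> float:
    """Rényi entropy of order alpha (in bits) of a discrete distribution;
    alpha -> 1 recovers the Shannon entropy."""
    if alpha == 1.0:
        return -sum(p * math.log2(p) for p in probs if p > 0)
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

dist = [0.5, 0.25, 0.25]           # Shannon entropy is 1.5 bits
print(renyi_entropy(dist, 2.0))    # collision entropy (order 2)
print(renyi_entropy(dist, 0.999))  # close to the Shannon value of 1.5
```

The conditional versions discussed in the paper are a separate design question; several inequivalent definitions exist, which is exactly the issue the abstract raises.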
The Maximum Error Probability Criterion, Random Encoder, and Feedback, in Multiple Input Channels
For a multiple input channel, one may define different capacity regions according to the criterion of error, the type of code, and the presence of feedback. In this paper, we aim to draw a complete picture of the relations among these different capacity regions. To this end, we first prove that the average-error-probability capacity region of a multiple input channel can be achieved by a random code un...
Journal: IEEE Trans. Information Theory
Volume: 40, Issue: -
Pages: -
Publication year: 1994